Results 1 - 20 of 281
1.
Toxicol Mech Methods ; : 1-6, 2024 Apr 08.
Article in English | MEDLINE | ID: mdl-38572596

ABSTRACT

Models of toxicity to tadpoles have been developed as single-parameter models based on special descriptors that are sums of the correlation weights of molecular features and experimental conditions. This information is represented by quasi-SMILES. Fragments of local symmetry (FLS) are involved in the development of the model, and the use of FLS correlation weights improves its predictive potential. In addition, the index of ideality of correlation (IIC) and the correlation intensity index (CII) are compared. These two potential predictive criteria were tested in models built through Monte Carlo optimization. The CII was more effective than the IIC for the models considered here.
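
The descriptor described above is, in CORAL-style models, essentially a sum of correlation weights assigned to quasi-SMILES attributes. A minimal Python sketch of that idea follows; the attribute extraction (single characters), the weights and the linear calibration are illustrative assumptions, not the published model.

import numpy as np

def attributes(quasi_smiles):
    # Treat each character of the quasi-SMILES as one attribute; real CORAL models
    # use richer attribute sets (pairs of symbols, fragments, condition codes).
    return list(quasi_smiles)

def descriptor(quasi_smiles, weights):
    # DCW = sum of the correlation weights of the extracted attributes.
    return sum(weights.get(a, 0.0) for a in attributes(quasi_smiles))

# Toy data: quasi-SMILES (structure plus condition codes) and endpoint values.
data = {"CCO+T1": -0.5, "CCN+T2": -0.2, "c1ccccc1+T1": 1.1}
weights = {ch: 0.1 for qs in data for ch in qs}   # stand-in for Monte Carlo optimised weights

X = np.array([[descriptor(qs, weights)] for qs in data])
y = np.array(list(data.values()))
A = np.hstack([np.ones_like(X), X])               # endpoint = c0 + c1 * DCW
c0, c1 = np.linalg.lstsq(A, y, rcond=None)[0]
print(round(c0, 3), round(c1, 3))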

2.
Environ Int ; 184: 108474, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38350256

ABSTRACT

Human health risk assessment has historically been built upon animal testing, often following Organisation for Economic Co-operation and Development (OECD) test guidelines and exposure assessments. Using combinations of human-relevant in vitro models, chemical analysis and computational (in silico) approaches brings advantages compared to animal studies. These include a greater focus on the human species and on molecular mechanisms and kinetics, identification of Adverse Outcome Pathways and downstream Key Events, as well as the possibility of addressing susceptible populations and additional endpoints. Much of the advancement and progress made in Next Generation Risk Assessment (NGRA) has focused primarily on new approach methodologies (NAMs) and physiologically based kinetic (PBK) modelling without incorporating human biomonitoring (HBM). The integration of toxicokinetics (TK) and PBK modelling is an essential component of NGRA. PBK models are essential for describing the TK processes in quantitative terms, with a focus on the effective dose at the expected target site. Furthermore, the need for PBK models is amplified by the increasing scientific and regulatory interest in aggregate and cumulative exposure, as well as interactions of chemicals in mixtures. Since incorporating HBM data strengthens approaches and reduces uncertainties in risk assessment, here we elaborate on the integrated use of TK, PBK modelling and HBM in chemical risk assessment, highlighting opportunities as well as challenges and limitations. Examples are provided where HBM and TK/PBK modelling can be used in both exposure assessment and hazard characterization, shifting from external exposure and animal dose/response assays to animal-free, internal exposure-based NGRA.
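
As a highly reduced illustration of the kind of kinetic description that TK/PBK models formalise, the Python sketch below computes a one-compartment concentration-time profile with first-order absorption and elimination. The parameter values are arbitrary assumptions, and the model is far simpler than the PBK models discussed in the abstract.

import numpy as np

def concentration(t, dose=100.0, F=0.8, V=42.0, ka=1.0, ke=0.1):
    # Classic one-compartment oral-dosing solution:
    # C(t) = F*D*ka / (V*(ka - ke)) * (exp(-ke*t) - exp(-ka*t))
    return F * dose * ka / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

t = np.linspace(0.0, 24.0, 25)          # hours
C = concentration(t)                    # e.g. mg/L if dose is in mg and V in L
print(f"Cmax = {C.max():.2f} at t = {t[C.argmax()]:.1f} h")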


Subjects
Adverse Outcome Pathways, Biological Models, Animals, Humans, Toxicokinetics, Biological Monitoring, Risk Assessment/methods
3.
ALTEX ; 41(2): 273-281, 2024.
Article in English | MEDLINE | ID: mdl-38215352

ABSTRACT

Because of both the shortcomings of existing risk assessment methodologies and the newly available tools for predicting hazard and risk with machine learning approaches, there has been an emerging emphasis on probabilistic risk assessment. Increasingly sophisticated AI models can be applied to a plethora of exposure and hazard data to obtain not only predictions for particular endpoints but also estimates of the uncertainty of the risk assessment outcome. This provides the basis for a shift from deterministic to more probabilistic approaches, but comes at the cost of increased complexity of the process, as it requires more resources and human expertise. There are still challenges to overcome before a probabilistic paradigm is fully embraced by regulators. Based on an earlier white paper (Maertens et al., 2022), a workshop discussed the prospects, challenges and path forward for implementing such AI-based probabilistic hazard assessment. Moving forward, we will see the transition from categorical to probabilistic and dose-dependent hazard outcomes, the application of internal thresholds of toxicological concern for data-poor substances, the acknowledgement of user-friendly open-source software, a rise in the expertise required of toxicologists to understand and interpret artificial intelligence models, and the honest communication of uncertainty in risk assessment to the public.


Probabilistic risk assessment, initially from engineering, is applied in toxicology to understand chemical-related hazards and their consequences. In toxicology, uncertainties abound: unclear molecular events, varied proposed outcomes, and population-level assessments for issues like neurodevelopmental disorders. Establishing links between chemical exposures and diseases, especially rare events like birth defects, often demands extensive studies. Existing methods struggle with subtle effects or those affecting specific groups. Future risk assessments must address developmental disease origins, presenting challenges beyond current capabilities. The intricate nature of many toxicological processes, lack of consensus on mechanisms and outcomes, and the need for nuanced population-level assessments highlight the complexities in understanding and quantifying risks associated with chemical exposures in the field of toxicology.


Subjects
Artificial Intelligence, Toxicology, Animals, Humans, Animal Testing Alternatives, Risk Assessment/methods, Uncertainty, Toxicology/methods
4.
Altern Lab Anim ; 52(2): 117-131, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38235727

ABSTRACT

The first Stakeholder Network Meeting of the EU Horizon 2020-funded ONTOX project was held on 13-14 March 2023, in Brussels, Belgium. The discussion centred around identifying specific challenges, barriers and drivers in relation to the implementation of non-animal new approach methodologies (NAMs) and probabilistic risk assessment (PRA), in order to help address the issues and rank them according to their associated level of difficulty. ONTOX aims to advance the assessment of chemical risk to humans, without the use of animal testing, by developing non-animal NAMs and PRA in line with 21st century toxicity testing principles. Stakeholder groups (regulatory authorities, companies, academia, non-governmental organisations) were identified and invited to participate in a meeting and a survey, by which their current position in relation to the implementation of NAMs and PRA was ascertained, as well as specific challenges and drivers highlighted. The survey analysis revealed areas of agreement and disagreement among stakeholders on topics such as capacity building, sustainability, regulatory acceptance, validation of adverse outcome pathways, acceptance of artificial intelligence (AI) in risk assessment, and guaranteeing consumer safety. The stakeholder network meeting resulted in the identification of barriers, drivers and specific challenges that need to be addressed. Breakout groups discussed topics such as hazard versus risk assessment, future reliance on AI and machine learning, regulatory requirements for industry and sustainability of the ONTOX Hub platform. The outputs from these discussions provided insights for overcoming barriers and leveraging drivers for implementing NAMs and PRA. It was concluded that there is a continued need for stakeholder engagement, including the organisation of a 'hackathon' to tackle challenges, to ensure the successful implementation of NAMs and PRA in chemical risk assessment.


Subjects
Adverse Outcome Pathways, Artificial Intelligence, Animals, Humans, Toxicity Tests, Risk Assessment, Belgium
5.
Toxics ; 11(12)2023 Dec 06.
Article in English | MEDLINE | ID: mdl-38133394

ABSTRACT

The OECD recognizes that data on a compound's ability to cause eye irritation are essential for the assessment of new compounds on the market. In silico models are frequently used to provide information when experimental data are lacking. So-called semi-correlations can be useful for building categorical models of eye irritation. Semi-correlations are latent regressions that can be used when the endpoint is expressed by two values: 1 for an active molecule and 0 for an inactive molecule. The regression line is based on the descriptor values, which serve to distribute the data into four classes: true positive, true negative, false positive, and false negative. These values are applied to calculate the corresponding statistical criterion for assessing the predictive potential of the categorical model. In our model, the descriptor is the sum of so-called correlation weights, which are defined by optimization using the Monte Carlo method. The target function of the optimization is related to the determination coefficient and the mean absolute error for the training set. Our model gives results that are better than those previously reported for the same endpoint.
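
A minimal Python sketch of the semi-correlation idea described above: regress the 0/1 endpoint on a single descriptor, classify by whether the fitted value reaches 0.5, and tally the four outcome classes. The descriptor values, labels and the 0.5 cut-off are illustrative assumptions.

import numpy as np

x = np.array([0.2, 0.8, 1.5, 2.1, 2.9, 3.4])     # descriptor (e.g. sum of correlation weights)
y = np.array([0,   0,   1,   0,   1,   1])       # 1 = active (irritant), 0 = inactive

slope, intercept = np.polyfit(x, y, 1)           # latent regression over the 0/1 endpoint
pred = (slope * x + intercept >= 0.5).astype(int)

tp = int(np.sum((pred == 1) & (y == 1)))
tn = int(np.sum((pred == 0) & (y == 0)))
fp = int(np.sum((pred == 1) & (y == 0)))
fn = int(np.sum((pred == 0) & (y == 1)))
mcc = (tp * tn - fp * fn) / np.sqrt(float((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)))
print(tp, tn, fp, fn, round(mcc, 3))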

6.
Molecules ; 28(20)2023 Oct 23.
Article in English | MEDLINE | ID: mdl-37894710

ABSTRACT

Data on Henry's law constants make it possible to systematize the geochemical conditions affecting the state of the atmosphere and, consequently, triggering climate change. Henry's law constants are needed for assessing processes related to atmospheric contamination caused by pollutants. The most important pollutants are those capable of long-range transport, an ability closely related to the values of their Henry's law constants. Chemical changes in gaseous mixtures affect the fate of atmospheric pollutants and, in turn, ecology, climate, and human health. Since the number of organic compounds present in the atmosphere is extremely large, it is desirable to develop models suitable for predictions for the large pool of organic molecules that may be present in the atmosphere. Here, we report the development of such a model for predicting the Henry's law constants of 29,439 compounds using the CORAL software (2023). The statistical quality of the model is characterized by a coefficient of determination of about 0.81 (on average) for the training and validation sets.

7.
Article in English | MEDLINE | ID: mdl-37770141

ABSTRACT

Most quantitative structure-property/activity relationship (QSPR/QSAR) techniques involve using separate programs for generating molecular descriptors and for building models based on the available descriptors. Here, the capabilities of the CORAL program are evaluated. A user of the program builds models from the representation of molecular structure by the simplified molecular input-line entry system (SMILES), together with experimental data on the endpoint of interest. The local symmetry of SMILES is a novel attribute composed of symmetrically arranged symbols: three symbols ('xyx'), four ('xyyx'), or five ('xyzyx'). We updated the CORAL software with this new, flexible descriptor, which is sensitive to the symmetric composition of a specific part of the molecule. Computational experiments have shown that taking account of these SMILES attributes can improve the predictive potential of models for the mutagenicity of nitroaromatic compounds. In addition, these computational experiments confirmed the advantage of using the index of ideality of correlation (IIC) and the correlation intensity index (CII) for Monte Carlo optimization of the correlation weights of various SMILES attributes, including local symmetry. The average coefficient of determination for the validation set (five different models) without fragments of local symmetry is 0.8589 ± 0.025, whereas using fragments of local symmetry improves this criterion of predictive potential to 0.9055 ± 0.010.
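
One way to read the 'fragments of local symmetry' attribute described above is as short palindromic windows of SMILES symbols ('xyx', 'xyyx', 'xyzyx'). The Python sketch below extracts such windows from a character-level SMILES string; character-level tokenisation (and counting windows with repeated symbols) is a simplification of how CORAL actually tokenises SMILES.

def local_symmetry_fragments(smiles):
    # Collect windows of 3, 4 or 5 characters that read the same forwards and backwards.
    frags = []
    for size in (3, 4, 5):                       # xyx, xyyx, xyzyx
        for i in range(len(smiles) - size + 1):
            window = smiles[i:i + size]
            if window == window[::-1]:           # palindromic window = locally symmetric
                frags.append(window)
    return frags

print(local_symmetry_fragments("O=[N+]([O-])c1ccccc1"))   # a nitroaromatic example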

8.
Molecules ; 28(18)2023 Sep 12.
Article in English | MEDLINE | ID: mdl-37764363

ABSTRACT

The assessment of cardiotoxicity is a persistent problem in medicinal chemistry. Quantitative structure-activity relationships (QSAR) are one possible way to build up models for cardiotoxicity. Here, we describe the results obtained with the Monte Carlo technique to develop hybrid optimal descriptors correlated with cardiotoxicity. The predictive potential of the cardiotoxicity models (pIC50, Ki in nM) of piperidine derivatives obtained using this approach provided quite good determination coefficients for the external validation set, in the range of 0.90-0.94. The results were best when applying the so-called correlation intensity index, which improves the predictive potential of a model.


Subjects
Cardiotoxicity, Pharmaceutical Chemistry, Humans, Cardiotoxicity/etiology, Monte Carlo Method, Piperidines, Quantitative Structure-Activity Relationship
10.
J Hazard Mater ; 460: 132358, 2023 10 15.
Article in English | MEDLINE | ID: mdl-37634379

ABSTRACT

We report here a quantitative read-across structure-activity relationship (q-RASAR) model for the prediction of binary mixture toxicity (acute contact toxicity) in honey bees. Both the quantitative structure-activity relationship (QSAR) and the similarity-based read-across algorithms are used simultaneously to enhance the predictability of the model. Several similarity- and error-based parameters, obtained from the read-across prediction tool, have been combined with the structural and physicochemical descriptors to develop the final q-RASAR model. The calculated statistical and validation metrics indicate the goodness-of-fit, robustness, and good predictability of the partial least squares (PLS) regression model. Machine learning algorithms such as ridge regression, linear support vector machine (SVM), and non-linear SVM have been used to further enhance the predictability of the q-RASAR model. The prediction quality of the q-RASAR models outperforms the previously reported quasi-SMILES-based QSAR model in terms of the external correlation coefficient (Q2F1 SVM q-RASAR: 0.935 vs. Q2VLD QSAR: 0.89). In this research, the toxicity values of several new untested binary mixtures have been predicted with the new models, and the reliability of the PLS predictions has been checked with the prediction reliability indicator tool. The q-RASAR approach can serve as a reliable, complementary and integrative addition to the conventional experimental approaches of pesticide mixture risk assessment.
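
A minimal Python sketch of the q-RASAR setup described above: conventional descriptors are concatenated with read-across similarity/error features and fed to a PLS regression. The feature values are random placeholders, not the published RASAR descriptors, and no claim is made about the original modelling pipeline.

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 60
X_struct = rng.normal(size=(n, 5))       # structural / physicochemical descriptors
X_rasar = rng.normal(size=(n, 3))        # similarity- and error-based read-across features
X = np.hstack([X_struct, X_rasar])
y = X @ rng.normal(size=8) + rng.normal(scale=0.1, size=n)   # synthetic mixture-toxicity endpoint

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)
pls = PLSRegression(n_components=3).fit(X_tr, y_tr)
print("external R2:", round(r2_score(y_te, pls.predict(X_te).ravel()), 3))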


Subjects
Pesticides, Quantitative Structure-Activity Relationship, Bees, Animals, Reproducibility of Results, Algorithms, Machine Learning, Pesticides/toxicity
11.
Toxics ; 11(7)2023 Jun 30.
Article in English | MEDLINE | ID: mdl-37505541

ABSTRACT

Dimensionality reduction techniques are crucial for enabling deep-learning-driven quantitative structure-activity relationship (QSAR) models to navigate higher-dimensional toxicological spaces; however, the choice of specific techniques is often arbitrary and poorly explored. Six dimensionality reduction techniques (both linear and non-linear) were therefore applied to a higher-dimensionality mutagenicity dataset and compared in their ability to power a simple deep-learning-driven QSAR model, following grid searches for optimal hyperparameter values. It was found that comparatively simple linear techniques, such as principal component analysis (PCA), were sufficient for enabling optimal QSAR model performance, which indicated that the original dataset was at least approximately linearly separable (in accordance with Cover's theorem). However, certain non-linear techniques such as kernel PCA and autoencoders performed at closely comparable levels, while (especially in the case of autoencoders) being more widely applicable to potentially non-linearly separable datasets. Analysis of the chemical space, in terms of XLogP and molecular weight, revealed that the vast majority of the testing data fell within the defined applicability domain, and that certain regions were measurably more problematic and degraded performance. It was, however, indicated that certain dimensionality reduction techniques were able to facilitate uniquely beneficial navigation of the chemical space.
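
A minimal Python sketch of the dimensionality-reduction setup described above: project a high-dimensional descriptor block with PCA before a small neural-network QSAR classifier. The data are synthetic and the hyperparameters are placeholders; the study tuned these by grid search.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 512))                  # high-dimensional descriptor block
y = (X[:, :10].sum(axis=1) > 0).astype(int)      # synthetic mutagenic / non-mutagenic labels

model = make_pipeline(
    StandardScaler(),
    PCA(n_components=32),                        # linear dimensionality reduction step
    MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0),
)
print("CV accuracy:", round(cross_val_score(model, X, y, cv=5).mean(), 3))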

12.
Front Toxicol ; 5: 1220998, 2023.
Article in English | MEDLINE | ID: mdl-37492623

ABSTRACT

Carcinogenic chemicals, or their metabolites, can be classified as genotoxic or non-genotoxic carcinogens (NGTxCs). Genotoxic compounds induce DNA damage, which can be detected by an established in vitro and in vivo battery of genotoxicity assays. For NGTxCs, DNA is not the primary target, and the possible modes of action (MoA) of NGTxCs are much more diverse than those of genotoxic compounds, and there is no specific in vitro assay for detecting NGTxCs. Therefore, the evaluation of the carcinogenic potential is still dependent on long-term studies in rodents. This 2-year bioassay, mainly applied for testing agrochemicals and pharmaceuticals, is time-consuming, costly and requires very high numbers of animals. More importantly, its relevance for human risk assessment is questionable due to the limited predictivity for human cancer risk, especially with regard to NGTxCs. Thus, there is an urgent need for a transition to new approach methodologies (NAMs), integrating human-relevant in vitro assays and in silico tools that better exploit the current knowledge of the multiple processes involved in carcinogenesis into a modern safety assessment toolbox. Here, we describe an integrative project that aims to use a variety of novel approaches to detect the carcinogenic potential of NGTxCs based on different mechanisms and pathways involved in carcinogenesis. The aim of this project is to contribute suitable assays for the safety assessment toolbox for an efficient and improved, internationally recognized hazard assessment of NGTxCs, and ultimately to contribute to reliable mechanism-based next-generation risk assessment for chemical carcinogens.

13.
J Mol Model ; 29(7): 218, 2023 Jun 29.
Article in English | MEDLINE | ID: mdl-37382683

ABSTRACT

CONTEXT: When applying the quantitative "structure-endpoint" relationship approach, reliable prediction is necessary but sometimes challenging to achieve. In this work, an attempt is made to establish the reliability of forecasts by creating a set of random partitions of the data into training and validation sets, followed by constructing random models. To be useful, such a system of random models should be self-consistent, giving similar, or at least comparable, statistical quality of predictions for models obtained from different splits of the available data into training and validation sets. METHOD: Computer experiments aimed at obtaining blood-brain barrier permeation models showed that, in principle, such an approach (Monte Carlo optimization of the correlation weights of different molecular features) can be used for the above purpose, taking advantage of specific algorithms to optimize the modelling steps and applying new statistical criteria such as the index of ideality of correlation (IIC) and the correlation intensity index (CII). The results obtained are good and better than those reported previously. The suggested approach to validation of models is not identical to traditionally applied ways of checking models. The concept of validation can be applied to arbitrary models (not only models of the blood-brain barrier).
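
A minimal Python sketch of the self-consistency check described above: rebuild a model on several random training/validation splits and compare the validation statistics across splits. Ridge regression on synthetic data stands in for the Monte Carlo/CORAL models used in the paper.

import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(150, 20))
y = X @ rng.normal(size=20) + rng.normal(scale=0.5, size=150)

scores = []
for split in range(5):                                   # five random splits, five models
    X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=split)
    model = Ridge(alpha=1.0).fit(X_tr, y_tr)
    scores.append(r2_score(y_va, model.predict(X_va)))

print("R2 per split:", [round(s, 3) for s in scores])
print("mean / sd:", round(np.mean(scores), 3), round(np.std(scores), 3))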


Subjects
Blood-Brain Barrier, Organic Compounds, Reproducibility of Results, Computer Simulation, Algorithms
14.
Int J Mol Sci ; 24(12)2023 Jun 08.
Article in English | MEDLINE | ID: mdl-37373049

ABSTRACT

A sound assessment of in silico models and their applicability domain can support the use of new approach methodologies (NAMs) in chemical risk assessment and requires increasing users' confidence in this approach. Several approaches have been proposed to evaluate the applicability domain of such models, but their predictive power still needs a thorough assessment. In this context, the VEGA tool, which is capable of assessing the applicability domain of in silico models, is examined for a range of toxicological endpoints. The VEGA tool evaluates chemical structures and other features related to the predicted endpoints and is efficient in measuring the applicability domain, enabling the user to identify less accurate predictions. This is demonstrated with many models addressing different endpoints, covering toxicity of relevance to human health, ecotoxicological endpoints, environmental fate, and physicochemical and toxicokinetic properties, for both regression models and classifiers.


Subjects
Ecotoxicology, Humans, Computer Simulation, Risk Assessment/methods
15.
Toxicol In Vitro ; 91: 105629, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37307858

ABSTRACT

Mutagenicity is one of the most hazardous properties from the point of view of medicine and ecology. Experimental determination of mutagenicity remains a costly process, which makes it attractive to identify new hazardous compounds from available experimental data through in silico methods or quantitative structure-activity relationships (QSAR). A system for constructing groups of random models is proposed for comparing various molecular features extracted from SMILES and molecular graphs. For the mutagenicity models (mutagenicity values expressed as the logarithm of the number of revertants per nanomole, assayed in Salmonella typhimurium TA98 with S9 microsomal preparation), the Morgan connectivity values are more informative than attributes based on the different rings in the molecules. The resulting models were tested with the previously proposed model self-consistency system. The average determination coefficient for the validation set is 0.8737 ± 0.0312.
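
The Morgan connectivity values mentioned above derive from the classic Morgan extended-connectivity algorithm: start from atom degrees and iteratively replace each atom's value with the sum over its neighbours. A small Python sketch on a hand-coded molecular graph follows; the adjacency list is a toy molecule, not data from the paper.

# Toy molecular graph (2-methylbutane carbon skeleton) as an adjacency list.
adjacency = {
    0: [1],
    1: [0, 2, 4],
    2: [1, 3],
    3: [2],
    4: [1],
}

def morgan_values(adj, iterations=3):
    values = {a: len(nbrs) for a, nbrs in adj.items()}    # initial connectivity = atom degree
    for _ in range(iterations):
        # Extended connectivity: each atom takes the sum of its neighbours' values.
        values = {a: sum(values[n] for n in nbrs) for a, nbrs in adj.items()}
    return values

print(morgan_values(adjacency))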


Subjects
Mutagens, Quantitative Structure-Activity Relationship, Humans, Mutagens/toxicity, Salmonella typhimurium/genetics, Biological Models, Microsomes, Mutagenicity Tests
16.
Arch Environ Contam Toxicol ; 84(4): 504-515, 2023 May.
Article in English | MEDLINE | ID: mdl-37202557

ABSTRACT

The traditional application of quantitative structure-property/activity relationships (QSPRs/QSARs) in the fields of thermodynamics, toxicology or drug design is predicting the impact of molecular features using data on the measurable characteristics of substances. However, it is often necessary to evaluate the influence of various exposure conditions and environmental factors in addition to the molecular structure. Different enzyme-driven processes lead to the accumulation of metal ions by worms: heavy metals are sequestered in these organisms without being released back into the soil. In this study, we propose a novel approach for modeling the absorption of heavy metals, such as mercury and cobalt, by worms. The models are based on optimal descriptors calculated for so-called quasi-SMILES, which incorporate strings of codes reflecting the experimental conditions. We modeled the impact of different combinations of heavy-metal concentrations and exposure times on the levels of proteins, hydrocarbons, and lipids in the earthworm body, observed over two months of exposure with a measurement interval of 15 days.
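
A minimal Python sketch of how exposure conditions can be appended to a structural string to form a quasi-SMILES, as described above. The condition codes and concentration bins are illustrative assumptions, not the encoding used in the paper.

# Build quasi-SMILES strings by appending codes for exposure conditions to a SMILES.
# Codes and concentration bins below are illustrative assumptions only.
def quasi_smiles(smiles, metal, conc_mg_per_kg, exposure_days):
    metal_code = {"Hg": "%10", "Co": "%11"}[metal]
    conc_code = "^1" if conc_mg_per_kg < 50 else "^2" if conc_mg_per_kg < 200 else "^3"
    time_code = f"&{exposure_days // 15}"        # 15-day measurement intervals
    return smiles + metal_code + conc_code + time_code

print(quasi_smiles("[Hg+2]", "Hg", 100, 45))     # -> "[Hg+2]%10^2&3"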


Subjects
Anthozoa, Heavy Metals, Oligochaeta, Soil Pollutants, Animals, Soil/chemistry, Oligochaeta/metabolism, Anthozoa/metabolism, Soil Pollutants/analysis, Heavy Metals/analysis
17.
Toxics ; 11(5)2023 Apr 29.
Article in English | MEDLINE | ID: mdl-37235234

ABSTRACT

Removing a drug-like substance that can cause drug-induced liver injury from the drug discovery process is a significant task for medicinal chemistry. In silico models can facilitate this process. Semi-correlation is an approach to building in silico models that represent the prediction in the active (1)/inactive (0) format. The so-called system of self-consistent models has been suggested as an approach for two tasks: (i) building up a model and (ii) estimating its predictive potential. However, so far this approach has been tested only for regression models. Here, the approach is applied to building and evaluating a categorical hepatotoxicity model using the CORAL software. This new process yields good results: sensitivity = 0.77, specificity = 0.75, accuracy = 0.76, and Matthews correlation coefficient = 0.51 (all compounds), and sensitivity = 0.83, specificity = 0.81, accuracy = 0.83 and Matthews correlation coefficient = 0.63 (validation set).

18.
Toxics ; 11(4)2023 Mar 23.
Article in English | MEDLINE | ID: mdl-37112520

ABSTRACT

Drug-induced nephrotoxicity is a major cause of kidney dysfunction with potentially fatal consequences. The poor prediction of clinical responses based on preclinical research hampers the development of new pharmaceuticals. This emphasises the need for new methods for earlier and more accurate diagnosis to avoid drug-induced kidney injuries. Computational predictions of drug-induced nephrotoxicity are an attractive approach to facilitate such an assessment, and such models could serve as robust and reliable replacements for animal testing. To provide the chemical information for computational prediction, we used the convenient and common SMILES format. We examined several versions of so-called optimal SMILES-based descriptors. We obtained the highest statistical values, in terms of the specificity, sensitivity and accuracy of the prediction, by applying the recently suggested atom pairs proportions vectors and the index of ideality of correlation, which is a special statistical measure of predictive potential. Implementation of this tool in the drug development process might lead to safer drugs in the future.
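
One plausible, simplified reading of the 'atom pairs proportions vectors' mentioned above is the relative frequency of pairs of chemical elements present in a molecule. The Python sketch below computes such proportions from the element symbols in a SMILES string; the exact definition used in the paper may differ.

import re
from collections import Counter
from itertools import combinations

def atom_pair_proportions(smiles):
    # Crude element tokeniser covering common organic elements; aromatic atoms are upper-cased.
    atoms = [a.upper() for a in re.findall(r"Cl|Br|[BCNOPSFI]|[cnops]", smiles)]
    pairs = Counter(tuple(sorted(p)) for p in combinations(atoms, 2))
    total = sum(pairs.values())
    return {p: round(c / total, 3) for p, c in pairs.items()}

print(atom_pair_proportions("CC(=O)Nc1ccc(O)cc1"))            # paracetamol as an example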

19.
Toxicol Mech Methods ; 33(7): 578-583, 2023 Sep.
Article in English | MEDLINE | ID: mdl-36992571

ABSTRACT

Quantitative structure-property/activity relationships (QSPRs/QSARs) are a tool of modern theoretical and computational chemistry. The self-consistent model system is both a method to build up a group of QSPR/QSAR models and an approach to checking the reliability of these models. Here, a group of models of pesticide toxicity toward Daphnia magna, built for different distributions of the data into training and test sub-sets, is compared. This comparison is the basis for formulating the system of self-consistent models. The so-called index of ideality of correlation (IIC) has been used to improve the predictive potential of the above models of pesticide toxicity. The predictive potential of the suggested models should be classified as high, since the average value of the determination coefficient for the validation sets is 0.841 and the dispersion is 0.033 (over all five models). The best model (number 4) has an average determination coefficient of 0.89 for the external validation sets (across all five splits).


Subjects
Daphnia, Pesticides, Animals, Reproducibility of Results, Software, Monte Carlo Method, Quantitative Structure-Activity Relationship, Pesticides/toxicity
20.
Molecules ; 28(4)2023 Feb 15.
Article in English | MEDLINE | ID: mdl-36838826

ABSTRACT

The reduction and replacement of in vivo tests have become crucial in terms of resources and animal welfare. The read-across approach reduces the number of substances to be tested, exploiting existing experimental data to predict the properties of untested substances. Several tools have been developed to perform read-across, but other approaches, such as computational workflows, can offer a more flexible and less prescriptive alternative. In this paper, we introduce a workflow to support analogue identification for read-across. The workflow was implemented using a database of azole chemicals with in vitro toxicity data for the human aromatase enzyme. The workflow identified analogues based on three similarities: structural similarity (StrS), metabolic similarity (MtS), and mechanistic similarity (McS). Our results show how multiple similarity metrics can be combined within a read-across assessment. The use of similarities based on metabolism and toxicological mechanism improved the predictions, in particular sensitivity. Beyond the predictions for a large population of substances, practical examples illustrate the advantages of the proposed approach.
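
A minimal Python sketch of combining several similarity metrics to rank analogues for read-across, in the spirit of the workflow described above: structural (StrS), metabolic (MtS) and mechanistic (McS) similarities are each computed as Tanimoto coefficients on feature sets and then combined with weights. The feature sets and weights are illustrative assumptions, not the workflow's actual metrics.

# Rank candidate analogues by a weighted combination of StrS, MtS and McS.
# Feature sets and weights are illustrative assumptions only.
def tanimoto(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0

target = {"str": {"triazole", "phenyl", "Cl"}, "met": {"CYP3A4"}, "mec": {"CYP19A1_inhibition"}}
candidates = {
    "analogue_A": {"str": {"triazole", "phenyl"},        "met": {"CYP3A4"}, "mec": {"CYP19A1_inhibition"}},
    "analogue_B": {"str": {"imidazole", "phenyl", "Cl"}, "met": {"CYP2C9"}, "mec": {"CYP19A1_inhibition"}},
}

weights = {"str": 0.5, "met": 0.25, "mec": 0.25}
scores = {
    name: sum(weights[k] * tanimoto(target[k], feats[k]) for k in weights)
    for name, feats in candidates.items()
}
for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(name, round(score, 3))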


Subjects
Aromatase, Hazardous Substances, Animals, Humans, Workflow, Secondary Metabolism, Peptide Biosynthesis, Risk Assessment/methods